Boosting Algorithms for Maximizing the Soft Margin
Authors
Abstract
Algorithm 1: SoftBoost
1. Input: S = ⟨(x_1, y_1), ..., (x_N, y_N)⟩, desired accuracy δ, and capping parameter ν ∈ [1, N].
2. Initialize: d^0_n to the uniform distribution 1/N.
3. Do for t = 1, ...
   (a) Train classifier on d^{t−1} and {u^1, ..., u^{t−1}} and obtain hypothesis h^t. Set u^t_n = h^t(x_n) y_n.
   (b) Calculate the edge γ_t of h^t: γ_t = d^{t−1} · u^t.
   (c) Set γ̂_t = (min_{m=1,...,t} γ_m) − δ.
   (d) Set γ* = solution to the primal linear programming problem.
   (e) Update ...
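In the paper, the update step projects the distribution onto the set of capped distributions whose edges for all past hypotheses are at most γ̂_t, by minimizing the relative entropy to the initial uniform distribution. The sketch below is a minimal Python illustration of that loop, not the authors' implementation: it assumes a fixed pool of weak hypotheses with precomputed outputs (U[m, n] = h^m(x_n) y_n), simulates the oracle by returning the max-edge hypothesis, and solves the projection numerically with SciPy; the names soft_boost and U are ours.

# A minimal sketch of the SoftBoost loop, not the authors' implementation.
# Assumptions: a fixed pool of weak hypotheses with precomputed outputs
# U[m, n] = h^m(x_n) * y_n, and an oracle that returns the max-edge
# hypothesis. The relative-entropy projection of the update step is
# solved numerically with SciPy's SLSQP.
import numpy as np
from scipy.optimize import minimize

def soft_boost(U, delta=0.01, nu=1.0, max_iter=100):
    """U: (M, N) array with U[m, n] = h^m(x_n) * y_n.
    Returns the final distribution d and the chosen hypothesis indices."""
    M, N = U.shape
    d0 = np.full(N, 1.0 / N)                 # step 2: uniform distribution
    d = d0.copy()
    chosen, edges = [], []
    for t in range(max_iter):
        m = int(np.argmax(U @ d))            # step (a): oracle call on d^{t-1}
        chosen.append(m)
        edges.append(float(U[m] @ d))        # step (b): edge of h^t
        gamma_hat = min(edges) - delta       # step (c): target edge
        Uc = U[chosen]
        # Update: relative-entropy projection onto the capped simplex
        #   min_d  sum_n d_n log(d_n / d0_n)
        #   s.t.   Uc @ d <= gamma_hat (componentwise), sum(d) = 1, d <= 1/nu
        def rel_entropy(x):
            return float(np.sum(x * np.log(np.maximum(x, 1e-12) / d0)))
        constraints = [
            {"type": "eq", "fun": lambda x: np.sum(x) - 1.0},
            {"type": "ineq", "fun": lambda x, Uc=Uc, g=gamma_hat: g - Uc @ x},
        ]
        res = minimize(rel_entropy, d, method="SLSQP",
                       bounds=[(0.0, 1.0 / nu)] * N, constraints=constraints)
        if not res.success:                  # infeasible projection: stop
            break
        d = res.x
    return d, chosen

The final combined hypothesis uses coefficients w_t obtained from the Lagrange multipliers of the last projection; recovering them is omitted here for brevity.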
Similar Papers
A Duality View of Boosting Algorithms
We study boosting algorithms from a new perspective. We show that the Lagrange dual problems of AdaBoost, LogitBoost and soft-margin LPBoost with generalized hinge loss are all entropy maximization problems. By looking at the dual problems of these boosting algorithms, we show that the success of boosting algorithms can be understood in terms of maintaining a better margin distribution by maxim...
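For concreteness, the soft-margin primal–dual pair this duality view builds on can be written as follows; this is a standard LPBoost-style formulation in our notation (with u^m_n = h^m(x_n) y_n as above), not text from the cited paper. The primal maximizes the soft margin,

\[
\max_{w \in \Delta_M,\; \rho,\; \psi \ge 0} \;\; \rho \;-\; \frac{1}{\nu} \sum_{n=1}^{N} \psi_n
\qquad \text{s.t.} \qquad \sum_{m=1}^{M} u^m_n w_m \;\ge\; \rho - \psi_n \quad (n = 1, \dots, N),
\]

and its Lagrange dual minimizes the maximum edge over a capped simplex of example weights,

\[
\min_{d \in \Delta_N,\; \gamma} \;\; \gamma
\qquad \text{s.t.} \qquad d \cdot u^m \;\le\; \gamma \quad (m = 1, \dots, M), \qquad d_n \;\le\; \tfrac{1}{\nu}.
\]

Augmenting such duals with an entropy term is what yields the entropy-maximization interpretation discussed in the cited paper.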
Non-convex boosting with minimum margin guarantees
Many classification algorithms achieve poor generalization accuracy on "noisy" data sets. We introduce a new non-convex boosting algorithm, BrownBoost-δ, a noise-resistant booster that is able to significantly increase accuracy on a set of noisy classification problems. Our algorithm consistently outperforms the original BrownBoost algorithm, AdaBoost, and LogitBoost on simulated and real data. ...
Speed and Sparsity of Regularized Boosting
Boosting algorithms with l1-regularization are of interest because l1-regularization leads to sparser composite classifiers. Moreover, Rosset et al. have shown that for separable data, standard l_p-regularized loss minimization results in a margin-maximizing classifier in the limit as regularization is relaxed. For the case p = 1, we extend these results by obtaining explicit convergence bounds o...
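The Rosset et al. result referenced here can be stated in summary form (our paraphrase, not the cited paper's notation): for separable data and suitable losses such as the exponential loss, the normalized regularized solution converges to an l_p-margin-maximizing solution as the regularization vanishes,

\[
\hat{w}(\lambda) \;=\; \arg\min_{w} \; \sum_{n=1}^{N} L\big(y_n\, w^{\top} h(x_n)\big) \;+\; \lambda \lVert w \rVert_p,
\qquad
\lim_{\lambda \to 0^{+}} \; \frac{\hat{w}(\lambda)}{\lVert \hat{w}(\lambda) \rVert_p}
\;=\; \arg\max_{\lVert w \rVert_p \le 1} \; \min_{n} \; y_n\, w^{\top} h(x_n),
\]

where h(x) collects the weak-hypothesis outputs.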
Regularizing AdaBoost
Boosting methods maximize a hard classification margin and are known as powerful techniques that do not exhibit overfitting for low-noise cases. For noisy data, however, boosting will still try to enforce a hard margin and thereby give too much weight to outliers, which then leads to the dilemma of non-smooth fits and overfitting. Therefore we propose three algorithms to allow for soft margin classification by ...
Boosting as a Regularized Path to a Maximum Margin Classifier
In this paper we study boosting methods from a new perspective. We build on recent work by Efron et al. to show that boosting approximately (and in some cases exactly) minimizes its loss criterion with an l1 constraint on the coefficient vector. This helps understand the success of boosting with early stopping as regularized fitting of the loss criterion. For the two most commonly used criteria...
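The path connection is often illustrated with ε-boosting: forward stagewise fitting with a vanishing step size, whose coefficient path closely tracks l1-constrained minimization of the loss. Below is a minimal sketch under our own assumptions (a precomputed matrix H of weak-hypothesis outputs, ±1 labels y, and the exponential loss); the function name and defaults are illustrative, not from the cited paper.

# A minimal epsilon-boosting sketch (forward stagewise fitting with a tiny
# step), illustrating the l1-path connection; names and defaults are ours.
# H is an (N, M) matrix of weak-hypothesis outputs, y holds +/-1 labels,
# and the exponential loss sum_n exp(-y_n * (H @ w)_n) is used.
import numpy as np

def epsilon_boost(H, y, eps=1e-3, steps=10_000):
    N, M = H.shape
    w = np.zeros(M)
    for _ in range(steps):
        margins = y * (H @ w)
        # Gradient of the exponential loss with respect to w
        grad = -(H * (y * np.exp(-margins))[:, None]).sum(axis=0)
        j = int(np.argmax(np.abs(grad)))     # steepest single coordinate
        w[j] -= eps * np.sign(grad[j])       # tiny step against the gradient
    return w

Each step changes ‖w‖₁ by at most ε, so stopping after T steps roughly corresponds to the constraint ‖w‖₁ ≤ εT; this is the sense in which early stopping acts as regularized fitting of the loss criterion.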